Latest news with #data privacy


Telegraph
9 hours ago
HMRC sacks dozens of staff for snooping on taxpayers
HM Revenue and Customs sacked 50 workers last year for breaking data privacy rules and snooping on taxpayers' records. In total, 354 tax office employees have been disciplined for data security breaches since 2022, of whom 186 have been fired, The Telegraph can reveal.

The tax office admitted that some were dismissed for looking up taxpayers' confidential information. HMRC holds a vast amount of sensitive data, such as addresses, salaries and National Insurance numbers. Staff are forbidden from looking up these details unless they have a genuine business reason. Despite the warnings, a number of employees have been caught accessing unauthorised accounts using HMRC's IT systems.

In 2024-25, 96 staff were disciplined for data security breaches, of whom 50 were later dismissed, according to data obtained by The Telegraph via a Freedom of Information request. HMRC said this represented less than 0.1pc of its nearly 68,000 staff. The numbers have fallen since last year, when 138 employees were disciplined and 68 were given the sack.

The figures covered all data security breaches, not just staff searching for taxpayers' records. Other examples of data breaches include making changes to records without authorisation, losing sensitive documents or failing to securely dispose of inadequately protected devices.

In one incident in 2023, an employee was sacked from HMRC after sending the data of 100 individuals to his personal email address. According to court documents, the staff member was visiting a business as part of a compliance check when he emailed himself a PDF containing a list of staff members' details – including their salaries and National Insurance numbers – and printed it off for the meeting using his home computer.
The incident was flagged to his line manager by the analytics team responsible for identifying data breaches, and he was dismissed for gross misconduct following an investigation. The worker took HMRC to an employment tribunal, arguing he had not been thinking straight at the time due to anxiety. However, the tribunal dismissed his claim for wrongful dismissal.

Data breaches like this have been on the rise since the pandemic because of remote working, according to one HMRC manager cited in the tribunal. In an email reminding staff never to send personal data outside the tax office's systems, the claimant's line manager wrote: 'There have been more incidents of this recently as we are working from home a lot more since Covid, but never send anything to your own private email address to print off that contains any personal or business data.'

Former HMRC inspectors said the importance of data security was drilled into employees from day one. Ronnie Pannu, of advice firm Pannu Tanu, said: 'When I was in HMRC, there was always a strong message from above that viewing a taxpayer's records where this was not necessary for a particular purpose was a serious issue which could have serious consequences for the individual concerned.'

John Hood, of accountancy firm Moore Kingston Smith, said: 'Any HMRC employee foolish enough to look up personal information that is not part of their usual responsibilities faces a ticking time bomb as most searches are tracked. As an additional security, some parts of the system are restricted so that only specifically authorised personnel can access them, such as the departments dealing with MPs and civil servants.'

All staff receive mandatory training on data security, and HMRC restricts access so workers can only look up customer records if this is needed for their specific role. In addition, the tax office tracks staff activity on its systems to deter misuse and record breaches.
Employees who break the rules are investigated and will face penalties, with each incident considered on a case-by-case basis.

Ellen Milner, of the trade body the Chartered Institute of Taxation, said: 'Taxpayers have to be able to trust that the private information they provide to HMRC will not be leaked, supplied to criminals or used for any purpose other than that for which it was provided, and in accordance with the law. That is why HMRC treats unauthorised access to records and data so seriously, and it is good to see that where breaches happen, HMRC will act.'

On the social network Reddit, users who claim to work in the Civil Service say they have seen new recruits at HMRC and the Department for Work and Pensions (DWP) fired after looking up their own records, or those of friends and celebrities, out of curiosity. In 2021, an administrative officer was sacked from the DWP after looking up her neighbour's address using Searchlight, a database containing information about almost everyone in the country. According to the court documents, DWP workers are explicitly told not to look up themselves, their families or celebrities. A DWP employee present at the tribunal described it as 'the number one rule' impressed upon everyone who joined the department.

Serious data breaches must be reported to the Information Commissioner's Office. HMRC's annual report shows that there were six incidents last year of employees changing customer records without permission, and two of staff losing inadequately protected devices.

HMRC is under growing pressure to strengthen its data security as criminal attacks grow more sophisticated and as it shifts towards becoming a digital-first organisation. It recently emerged that 100,000 taxpayers had been affected by phishing attacks in the past year. Criminals used stolen credentials to access taxpayers' accounts and claim significant sums in rebates.
There was no loss to the individuals, who have since been contacted, and a number of arrests were made – but the cost to the taxpayer was £47m.

A spokesman for HMRC said: 'Instances of improper access are extremely rare, and we take firm action when it does happen, helping prevent a recurrence. We take the security of customers' data extremely seriously and we have robust systems to ensure staff only access records when there is a legitimate business need.'


New York Times
3 days ago
Are Your Smart Devices Really Spying on You?
CAIRA: Okay. Now we know what data they're collecting. Can you tell us who is this amorphous they? Who is actually collecting the data? JON: It's a guy named Gary. CHRISTINE: I'm Christine Cyr Clisset. CAIRA: I'm Caira Blackwell. ROSIE: I'm Rosie Guerin, and you're listening to The Wirecutter Show. ROSIE: Hi friends. CHRISTINE: Hey there. CAIRA: Hi. ROSIE: So we are talking about data privacy on the show today, and I've actually been thinking and kind of worrying a lot about it recently, maybe increasingly since having kids. Okay, let me explain. I was in the car the other day and my wife sent me a text message. And so, being the good safe driver I am, I had Siri play it. And my wife must have had the new Siri AI thing enabled, because not only did it transcribe the words she wanted to text to me, it started describing the photo she sent, which happened to be of my two children. And she's describing their features and their faces, and I was like, cancel. Cancel. She being the AI. She being Siri. Yes. Was she accurate? Not really, but I assume they're learning, right? And so that really gave me pause, and it started me thinking about the age we live in, with these two poles: on the one hand, all we have been able to achieve from technology, from our lives being very connected, and on the other hand, relinquishing control of our data and therefore our privacy. CAIRA: Yeah. CHRISTINE: Yeah, I mean, it's happening around us all the time. I think we all kind of understand, or most of us understand, that when we're using things on the internet, we're kind of relinquishing a little bit of our information every time we do that. We actually published recently a bunch of articles around data privacy and security, and one of them we're going to talk about today. We're going to bring on Jon Chase, who is our supervising editor of smart home coverage at Wirecutter. And his team did this pretty intense deep dive into looking at... 
The data that these different devices are collecting about us. It's not surprising if you have a quote unquote smart device like a smart speaker that it's collecting data on you. I know that some people choose not to get these devices because of that. But what this team found is that a lot of devices in our homes that we may not think of as smart devices are actually collecting quite a lot of data and it's pretty up in the air about what's actually happening with this data. CAIRA: So good. I'm so glad that we gave up our freedoms for this convenience. Well, but like, what's the re- ROSIE: You know, it's tough because what's the recourse? Going and living off the grid, which actually kind of sounds really nice. Learn how to farm. But we're in this, I think. CAIRA: Okay, so after the break, we're going to talk with Jon about which devices are spying on you, what they're looking for, and how to protect your data. Be right back. CAIRA: Welcome back. With us now is Jon Chase. He's a supervising editor on the tech team who covers smart home devices for Wirecutter. He's also been writing about tech for over two decades. A fun fact about Jon is that in addition to his impressive career in journalism, he's also worked as a TV writer for several game shows, including my favorite Cash Cab and Who Wants to Be a Millionaire. CHRISTINE: Jon, welcome to the show. JON: Happy to be here. ROSIE: That's incredible. And also, very much tracks. CHRISTINE: Yeah. CAIRA: Yeah. I want to know- CHRISTINE: You better say something funny on this episode. CAIRA: Oh. I just want to know if you got to ride in the Cash Cab. JON: We followed the Cash Cab, keeping up and making sure we could manage it. ROSIE: Ah. CHRISTINE: All right. CAIRA: Okay. Well, Jon, right off the bat, I want to know which of my smart devices are spying on me because I know it has to be more than just Alexa and Amazon. 
JON: First off, we should tell anyone who's listening to this to mute their smart speakers because they're going to be triggered in more ways than one. I would say spy is a very charged term. CAIRA: Okay. JON: You might just say they are paying attention, close attention. Yeah. We like to say that data is the fuel of the smart home. These devices, all of the amazing things they can do, they fully depend on creating and collecting data, and synthesizing it, and sending it to the cloud. That's just table stakes. You can't really get around that. I don't think anyone would be surprised that smart home devices are collecting data. That's what they do. But I definitely think there's a few that we encountered that are doing a whole lot more than we suspected. Smart TVs were pretty egregious. CHRISTINE: I often think of a smart home device as a smart speaker, something that has smart in the title, a smart thermostat or something. But what are the devices we're really talking about in the home? JON: Yeah. A lot of times, there's devices that are just Wi-Fi connected, which aren't necessarily smart. We've always defined it as any device that has a control app, has the ability to be accessed remotely, and connects to the internet. Usually, we bias towards ones that can be controlled using a third party platform, which many people are familiar with if you have Google Home, Amazon Alexa, Apple Home. That's the broad definition. But you've got your smart speakers, you've got your smart light bulbs. Thermostats, like your Nest and Ecobee, things like that. There's also a lot of kitchen appliances that have, for a long time ... Amazon put out an Alexa-powered microwave. CAIRA: Why? JON: And an Alexa-powered clock. There's a lot of stuff out there that may not even be labeled smart, but has the ability to be connected. 
ROSIE: Until I have a robot that can take my food and put it into the microwave, I don't understand the purpose of being able to be far away and having Alexa turn on my microwave. JON: First of all, that's what kids are for. You tell the kids to put the stuff in the microwave. ROSIE: Fair point. JON: I get that. But I think one of the things, I sometimes feel like I'm a smart home apologist ... but really it's problem solving. There's a lot of people ... One of the things we've really learned in the past few years is with the accessibility community, people who have mobility issues, things like that, a bulb that goes on and off at a set time for someone who isn't able to turn off light bulbs is a godsend. CHRISTINE: Yeah. Turn fans on when it's hot. JON: Yeah. CHRISTINE: Turn your AC on. JON: Change the temperature, yeah. CHRISTINE: Yeah, there's definitely some real, real use cases that benefit. ROSIE: It's life-changing, yeah. CHRISTINE: Yeah, it can be really life-transforming. JON: Then, back to what I said before, all of that depends on data. CHRISTINE: I know a bunch of people who refuse to get a smart speaker because they are concerned with these devices listening in on them or collecting their data. You just listed many devices, many other types of devices, things that people might not think of as smart devices. What exactly is the type of data that they're collecting? JON: There's personal data type stuff, which I'll talk about in a second, and then there's also just functional data type stuff. I'll give the example of a thermostat. A smart thermostat, it's checking the temperature nearby. Some of them have a motion sensor. Some of them also have a presence sensor, and they might even have other sensors that are remote. And it also connects to the internet, and it learns over time if the temperature is X degrees and it is this inside, and the weather is this, it'll take this amount of time for your heat to fully heat or cool your house. 
That's functional data. Then on top of that though, it may have your address. If you pair it with other devices, then those devices- CAIRA: They talk to each other? JON: They talk to each other. And if you connect it to a third party service, say like Amazon Alexa or Google Home, or something like that, then things start to get, I will use the technical term, hinky because it becomes really, really confusing. I think the issue we're all going to be talking about here is just how no mere mortal has the capacity to really gauge what's going on. CHRISTINE: Right. None of us really know how much of our personal or situational data is out there at this point. JON: That's right. Smart speakers are probably the most obvious example of people getting skeeved out, another technical term. There's some truth, and then there's also a lot of anecdotal, weird stuff that happens and I think it colors the whole experience. Plainly, I'll just state we spoke with all these companies, we've tested these things for years. I've spoken with this great researcher who works at Georgia Tech and has tested all these devices. They give you a signal when they're listening, and they're always listening. I'll just use Alexa as an example because it's the foremost example. An Alexa is always listening, but it's literally listening for a particular waveform, a vocal code, and that's what's called the wake word. You can change it, it's Alexa by default, but you can say "computer" or you can say "Echo," that kind of stuff. Then eventually, it'll hear that tone and it perks up. And it signals that it's perked up. There's a light, so you know visually that it's happened. If it's a wrong thing, it'll just fade and delete the recording. If it's correct and you actually are communicating with it, it will interact with you and that kind of stuff. That does go to the cloud. Now, depending on your settings, we can talk about what that actually means. 
You can opt to have that recording saved or not. Depending on the platform, meaning the device, you can also decide whether it records or not. For instance, Google speakers don't actually save recordings by default, which was a great surprise to us. CAIRA: I feel like a lot of people might think that an Alexa speaker is listening literally all the time. You hear people say, "I was talking with my friend about a toothbrush that I really wanted, and then next thing you know, I get a toothbrush ad, so it must be listening to me all the time." It sounds like that's not actually happening. JON: Yeah. There is a phenomenon of what you're talking about. I don't know if there's a name for it, but there is a thing where people are like, "Oh, we were talking about Aruba," or a baseball bat, or some kind of thing like that, and then it shows up in your feed. My understanding, from speaking with a whole bunch of people about this: almost all of us walk around with a smartphone. When you connect to the internet, you have what's called an IP address, which is specific to you. If you are connected to Wi-Fi, and someone else is connected to Wi-Fi, and someone else is connected to Wi-Fi, you become associated. Then you might also travel to other places and you might also search for certain things. Suddenly, it just all becomes algorithmic. There's basically an association: the data profiles of people. These live on your phones, they live in your laptops. These devices collect information on you, on your search habits, your location, things like that. They allegedly get anonymized, but ads are served to you based on those things. If there's an affinity, if you are around other people, they might be like, "Oh, okay, we think she's a white woman who's 21 obviously." CHRISTINE: Obviously. JON: Yes. She traveled to Florida, and also does this and does this, and they might serve you the same ads. 
It might be that any one of you might have searched for something recently and that ends up being the trigger. CHRISTINE: This happened to me recently because Caira showed me this swimming pool she went to and it popped up on my Instagram feed. CAIRA: Really? CHRISTINE: I was like, "I've never searched this." JON: But I'll bet you ... You searched it, right? CAIRA: Probably to show Christine. JON: But you searched it. CAIRA: Yeah. JON: And you're affiliated with her because of your address, so therefore there's a pretty good, better than not chance that this would interest her. CHRISTINE: Yeah. It wasn't because we vocally talked about it- JON: Nothing to do with that. CHRISTINE: ... together in this studio. JON: Yes. CHRISTINE: It was because she had searched for it on her phone and our phones were in the same room. JON: That's right. CHRISTINE: That's so creepy. JON: But also, you guys probably gallivant and you're associated in multiple places. That gives you, "Oh, well, she likes nice restaurants." You go clubbing a lot I know. All that stuff, it adds up to this profile. CAIRA: The data profile. CHRISTINE: Yeah. JON: The data profile, that's where you start getting really icky. ROSIE: Jon, you talked about the functional data that's being collected. What other kind of data is being collected from these smart devices? JON: Sure. As we talked about, when you're setting up a device, you use an app. The app will probably ask you, depending what the device is, it might say, "What's your address? Because I need to know that for geolocation," which is- CHRISTINE: Where you are on the Earth. JON: Where you are on the Earth and that's used for a lot of really cool functions. You can have things turn on and off when you leave and come home from your house, or something like that. It might be email, phone number. You might have billing associated with it, you might have a credit card. But there's also stuff like there's your IP address. 
That is not necessarily personal, but it's one of those things that once it becomes associated with you, all of the IP addresses that you travel around the world and connect to all become part of a profile, and they make you more and more findable. You also might have health data. CHRISTINE: I was going to ask you about health data. I have an Apple Watch, I work out with my Apple Watch. I've started using Apple Fitness, and the fitness app and my Apple Watch are integrated. When you're using a device like that, then presumably data like your weight, and other metrics that are in there, is going into your profile, right? JON: Yes. I will say Apple actually has really good policies around that. But if you have a non-Apple thing like a headphone- CHRISTINE: Yeah. I sometimes use my Soundcore headphones with my Apple Watch. JON: Right. Those can. There are really no rules around health data. Any app ... A lot of headphones now have cardiac monitors, and sleep monitors, all this kind of stuff. They actually can access your health data and they have willy-nilly access to it. They can do whatever they want with it. HIPAA is… it doesn't actually protect people nearly as much as they think it does. CAIRA: Okay. Now we know what data they're collecting. Can you tell us who is this amorphous they? Who is actually collecting the data? JON: It's a guy named Gary. ROSIE: C'mon, Gary. JON: Yeah. The companies themselves. One of the things we learned is large companies actually tend to be much more trustworthy than small companies. Not out of any sense of malice, but because a lot of times, smaller companies simply do not have the technical chops to do the security testing. CAIRA: When things fall through the cracks, who is it going to? Who's buying it? JON: Oh, yeah. Great question. There's companies that own it themselves. People talk about Amazon. Amazon is a giant sales company. 
They want data about you to put products your way and they have millions of partners. They state unequivocally, "We do not sell data that is collected," but they're still using it. But also, Google. Google is an advertising company. They also help make ads for other companies. I spoke with someone who had been a higher level product specialist about this stuff. He was like, "Yeah, we don't give the data that we collect to those other companies, but they're putting your data to work." And then there's data brokers, which are these companies that scour public records and probably work with credit card companies. Credit card companies monetize your data. Data brokers find ways to get all this data outside of that. There's the companies themselves, which have their data, and then there's external companies that are just finding inroads and trying to monetize that. CHRISTINE: How are the data brokers ... You've got all this smart home data that has been collected about you. The companies that own the devices or that have made the devices have this data. How are these data brokers accessing your data? JON: I will say I can't state unequivocally that every company works this way. But we spoke with some people at DuckDuckGo, which is a privacy company. They sampled a huge chunk of the top downloadable Android apps, and something in the high 90s, percentage-wise, had Google Analytics in them. That's because Google, Facebook, and other companies like that, they help small companies make apps more easily. In doing so, they have their software in there. Even though you may not have an association with Google, you may be using an app that has Google Analytics in it, so they would get some of your data. The idea that even though these companies say your data is anonymous, outside companies, data brokers, they can ... It's a statistical ... CAIRA: They can just piece it all together. JON: They can piece it all together. "It's anonymous. Oh, no, we protected it. 
We have anonymized data." Well, yes. But once they have an association from here, from here, from here, from here, data brokers specialize in unearthing this information, selling it to the highest bidder, and it gets used for useful purposes. But at the same time, people have been stalked, law enforcement uses these, it's used at the border, ICE uses it. Insurance companies use it. They may or may not decide to cover you. The greater overarching concern here is you don't know what data is collected on you, you don't know how true it is, and you have no access to it. Those are the real problems and there's next to no regulation around any of this. We're just swimming around in this gray area and all of this is happening around us constantly. CHRISTINE: I've got to know, why is this legal? Why do we live in a culture and a society where this can happen? ROSIE: Why is this okay? JON: Because cha-ching! (singing) Money, money, money, money. Money! I would say the history of technology and innovation when it comes to the government is the history of the government running frantically behind with a briefcase that has papers flying out of it and a floppy hat. Like, "Wait! We're trying to catch up." There's vested interests that are like, "Oh, we're going to make a lot of money," and people want to protect that. There's also just the wheels of justice move slowly kind of thing. You don't want the government to walk in with a hammer and just slam down and stop innovation. But at the same time, we've struggled to find a system that keeps tabs. The innovations happen so quickly. We're seeing this with AI especially. CAIRA: All right. Well, I don't want to hear any more about it. CHRISTINE: We're done here, we'll see you later. CAIRA: That's a wrap. CHRISTINE: I'm going to go into my bunker. JON: Totally, it's disturbing, it's annoying, et cetera. But the better news is on a state-by-state level, there is legislation. California, always at the fore with this stuff. 
They have legislation that has come out where data brokers actually have to register. It isn't hopeless, insofar as states are taking on some of the burden here. There's also a program called the Cyber Trust Mark Program, which is supposed to basically be like a food label type thing, a nutrition label, that will be on smart devices in particular. That's actually in the works, and the hope is it will basically say, "It does this, it does this, it meets these standards." Then there's also the example of Europe, where they have passed, I think it's, GDPR. You probably had all those popups every time you access a website that are like, "Do you submit to these cookies," and all that kind of ... That's essentially from that. It's a step forward, it basically holds these companies accountable and allows people to opt out of data collection policies. CAIRA: Okay. All right, to quickly recap, it sounds like it's more than just your smart speaker that's kind of listening to you. It's anything that can be connected to the internet that you probably are using for convenience has the ability to maybe collect some of your data. And there isn't really much oversight on how much they can collect, just from a government level, and it sounds like there isn't much incentive for that to change, but things are maybe moving in the right direction on a state-by-state basis. Also, the way that you are describing these companies collecting data reminds me of when you go to a bookstore and they cover the books in a brown paper bag essentially and they write a description on the bag. They're like, "If this sounds like it's for you, you should buy this book." Companies are doing that to our data and us, "anonymizing" us and selling it to the highest bidder essentially, right? JON: Yeah, I think that's right. CHRISTINE: All right. We're going to take a quick break, and when we're back we're going to get into more details about how specific devices you may own are collecting your data. 
We're also going to talk about the ways that you can keep these devices from collecting all this data, some safety measures you can take. We'll be right back. ROSIE: Welcome back. Jon, in your article you highlighted three main devices that are perhaps most culpable here. Smart speakers, smart TVs, security cameras, which include video doorbells. Can you briefly explain how each of these devices is collecting your data? Let's start with smart speakers. JON: Sure. With a smart speaker, all of them, there's a setup thing and it's related to an app. You're incorporating these devices into what we call a platform, it's essentially an app. They have your basic stuff. Your name, your home. They might have access to your contacts. They might have access to, depending on the device, your photos, if you have one that has a screen on it. If you have synced them up with other devices, which is very common, they may have access to whatever those devices are collecting. Then on top of that, with a smart speaker, you're talking to it, you're asking it questions. You are- ROSIE: Teaching it. JON: You are teaching it, exactly. Honestly, that's about to get many magnitudes greater because the current versions of, say, Alexa, they learn a little bit, but they don't actually learn in a meaningful way about you. For instance, the new version of Alexa+, it will be learning deeply about you. You can tell it things like, "Oh, hey, Alexa, I'm allergic to gluten, I hate Bob Seger, and I only drive Fords." It will internalize that and whenever it's answering you or things like that, it will in theory use that in making suggestions. ROSIE: Because of AI, right? JON: Because of AI, yeah. It's generative AI, there's large language models, and all that kind of stuff. Basically, it's a different way of interacting with these things, they're actually learning. That's what's coming. ROSIE: Can we talk about TVs now? 
Because I think this is the most mind-blowing thing that I read in your article. It's like when I'm watching Severance, my TV is watching me. How is that happening? JON: It's watching your Innie. ROSIE: Oh, God. JON: No, your Outie. ROSIE: My Outie! JON: Your Outie. It's watching your Outie. Or is it? I don't know. ROSIE: It's both. JON: Yeah. TVs were ... After we did our initial research, Lee Neikirk, who covers this, very casually presented this information and everyone's jaw dropped. Essentially, there's a technology called ACR, automatic content recognition. It's essentially like, if you've ever used ... the most common one is Shazam, or something like that, where you want to identify- CHRISTINE: A song? JON: A song, or something. CHRISTINE: Right. JON: There's technology like that built into most every TV. We don't know of a TV that doesn't have it that's been made in the last few years. ROSIE: It doesn't have to be labeled a smart TV to have this? JON: There's almost no such thing as smart TVs anymore because they're all smart. Any TV that you're going to connect to the internet or anything like that, they almost certainly have automatic content recognition, ACR. What happens is you're watching TV, and every couple seconds, the TV's taking functionally what is a screenshot of what is on your screen and sending it up to the internet. It's analyzed, and then it's added to a data profile. Then that is sold, shared, packaged, whatever. Now, if you also have something plugged into your TV, it captures anything that goes on your screen. It's this amazing thing where it recognizes what's there; it's not even just what's streaming through your TV. CHRISTINE: Slideshows of your kids or a vacation. JON: Slideshow of your kids, yeah. CHRISTINE: Or of a vacation, or something. JON: Now, I don't know what happens on the far end. They might just be like, "Oh, blobby shapes." It's not identifying your kids or something like that. But also, it's out there and you have no idea of that. 
The crazy thing is you sort of opt into this almost certainly accidentally. If you buy a new TV and you hit yes, yes, yes, yes, yes, yes, boom, one of those is almost certainly ACR. They might have a branded name for it. Also, if you use a ... Let's say your TV itself isn't connected to the internet, but you have a Google TV, or Roku especially. CAIRA: A Fire Stick? JON: A Fire Stick, those also have ACR. I will say Apple TV is the only company we found of the large companies that does not have ACR built in. CHRISTINE: Well, they already have our phones. CAIRA: They already have all of my data. ROSIE: What more do you need? JON: But if you have your TVs connected to the internet and you plug in Apple TV into it, anything you watch through your Apple TV, the TV is going to see it and the TV is going to send it off. CHRISTINE: Okay. One last device. Let's talk about security cameras. JON: Yeah. CHRISTINE: I don't think it's surprising, these things are meant to watch you. JON: Yes. CHRISTINE: That's the whole point, right? JON: Yes. CHRISTINE: But it is surprising the data that these are collecting. Tell us about that. JON: Yeah. Rachel Cericola reported on this and she referenced the Surfshark data study. Surfshark is a VPN security company. They said that among all the typical smart home devices, security cameras actually collect the most data points. Not volume of data necessarily, but the different types of data. That's because cameras, they're visual. They have temperature sensors. Nest cameras have facial recognition. Some people were like, "Hey, that's very cool. I can go back and search for all the times that-" CAIRA: That thief came to my door. JON: Yes, Mr. Thief. That's my neighbor, Mr. Thief. But you can also understand why someone coming to your door might not want their image on there because who knows what's happening to it online. Again, Nest has a very comprehensive and relatively accessible privacy policy, but there's a lot of gray area in there. 
There are some states that have proactively banned this, because they say, "Oh, you should not have the ability to take someone's face, and label it, and put it on the internet" kind of thing.

CAIRA: Yeah, I like that. I like that rule. We should add more of that everywhere.

CHRISTINE: Jon, based on the first half of this show, I am now completely terrified and I am not willing to have any smart devices in my home.

JON: No! I have failed. I have failed you.

CHRISTINE: I am going back to a rotary phone, I am moving to the country. No, in all honesty though, it sounds pretty concerning. I don't want these devices to collect so much about me. But I know that you have a lot of smart devices, and clearly you're not getting rid of all of them, you're not going into your bunker. Why do you feel comfortable having all of these smart devices in your home?

JON: I think there is absolutely a comfort level thing. I think also, having covered security for a good long while, I feel like for the most part, what happens with the smart home is not all that different from what happens when you use the internet and a smartphone. I do take basic measures that I think are useful, and they do limit extreme exposure. But I also understand it's a cost-benefit type thing. One of the security people we spoke with talked about security cameras. He's like, "I analyze all these devices, I know what their pitfalls are. But I feel more protected having security cameras outside than the alternative, not having them. I've got little kids and I don't like having cameras inside." We cover smart security cameras a lot, and we do recommend a bunch of indoor ones. You can have them only trigger when there's a pet. You can have them only trigger during certain hours. Some of them, you can have them turn off as you come home and go back on when you leave, that kind of stuff. You shouldn't just be casual about bringing one of these devices into your home.
You should really be thoughtful about it. I do think one of the things we've learned is: do not buy rando devices that you see, the cheap thing with nine consonants in a row and one vowel, named like a knockoff of some real device. We work really hard on our picks, and we vet them, and we trust them. Trust, but verify. That's our whole thing. I don't think you should be incautious around these at all, but I also don't think these are necessarily any worse than the rest of your life online.

CHRISTINE: Right. You need to run a cost-benefit analysis on each device you're using in your life, that's the thing.

JON: And just be prudent, like you do with everything else. Use good passwords. Don't use bad passwords. Don't make it easy.

CHRISTINE: Let's talk a little bit about what people can do to protect their data. Let's break it down by the devices we just talked about. Let's talk through the smart speakers, the TVs, and the cameras. What are the steps people can take?

JON: You interact with smart speakers with your voice, so if you're unhappy with your voice potentially being recorded and sent out there, you can go into any one of the control apps associated with the device and actually just turn that off. Within the privacy settings, there's a way to do it. I'm not going to describe it for each one right now, but there's the ability to either limit recordings or stop them altogether, or you can go back and delete old ones, things like that. That's really basic. You can opt not to share a precise location. You can sometimes just put in a zip code or something like that when you're setting these up. You can also use email addresses that are not your main email address.

CHRISTINE: Which we talked a lot about in our episode from a couple months ago with Max Eddy, and we'll link that in the show notes. Okay, what about TVs?

JON: TVs are more in-depth. Essentially, there is one regulation, yay!
That is, if you have ACR built into a TV, they are required to make it optional; you are allowed to opt out. They do not necessarily make it easy, so you're going to have to dig into the settings. Same thing, I believe, with the Roku and stuff like that: you have to go in and actively turn it off. It may not be called ACR, it may have another name. You can probably go to the support page and they'll guide you through it or something like that. But basically, you're going to go in and turn off one or more settings, sometimes there are a few, about data collection, watching habits, other stuff like that. Yeah, turn it off.

CHRISTINE: Okay. Let's talk about security cameras. What should people do there?

JON: Yeah. I mentioned how one of the people we spoke with was like, "Yeah, I don't have them inside." But what you can do is shop for ones that have robust security settings and the ability to turn them on and off. Don't put them in sensitive areas. I know a lot of people have kids and they'll have a baby camera. It's not the same thing as a security camera, but those may be going up to the cloud. You may want to think about that, and make sure the company has really secure data handling practices. I just used one that was audio only. You can also opt not to use AI with the cameras. The AI makes them helpful, but it's an option.

CHRISTINE: It sounds like with all of these devices, you really want to know that you are buying a product from a relatively trustworthy company. Of course, I'm going to plug Wirecutter picks. If you're listening and wondering, and you want to shortcut it, we've done a lot of research. But if you're going to go out into the wilderness and try to do this on your own, you need to do your homework and really sort out whether the companies have good data privacy policies.

JON: Absolutely. That's what we do: we do your homework for you.

CHRISTINE: Right.
JON: That's why these are especially important picks, because the stakes are so high.

CAIRA: Okay. Before we wrap, we always ask our guest one last question. Jon, what's the last thing you bought that you've really loved?

JON: The last Wirecutter pick I bought that I really loved, I will be honest with you, I haven't used it yet. It's actually a Christmas present under the tree, I'm so excited. One of my neighbors bought a Ryobi power washer. I'm going to give you the really eloquent name.

ROSIE: That was a wind-up to a power washer.

JON: I live big. Maybe you've heard of the Ryobi RY 1419MTVNM 1900 PSI electric pressure washer.

CHRISTINE: Wow! That does sound like a Christmas present right there.

JON: Yeah. I was like, "What is that horrible noise? What is it that's been going on all day long?" Then I borrowed it, and it was like I can't find enough uses for it. It's awesome.

CAIRA: Yeah.

JON: I'm unreasonably excited to get home and open that thing.

CAIRA: Wow!

JON: Yes.

ROSIE: Jon, I'm so happy for you. I'm grateful that you have joined us today to talk about what has been undoubtedly harrowing, but also incredibly instructive. Thank you so, so much.

JON: It is my pleasure.

CHRISTINE: Okay, are we all creeped out now by all of the data that everything is collecting about us? Terrified.

ROSIE: Thank you so much for checking in.

CHRISTINE: Okay, well, what did you learn today? What are you taking away?

ROSIE: I learned a lot. One of the things I'm taking away, as soon as possible: Apple TVs are not opting into this automatic ACR. That's the feature that enables smart TVs to screenshot what's on screen and then send your data right up into the cloud. That makes me wanna buy one. Like today. Another takeaway is to just, generally speaking, keep an eye on AI advances across all of these smart devices and be more vigilant.

CAIRA: Yeah, that's part of my takeaway too. I definitely need to be more diligent about opting out of things.
You know, when you download an app, don't let it just take all your data immediately. Also, y'all will never catch me with a smart speaker, ever. I think I'm just not going to risk it. How about you, Christine?

CHRISTINE: Yeah, you know, I think I now understand why I am getting served certain ads when I didn't Google something. So now I'm understanding a little bit more about how I'm in a network with all of you, and then you're in networks probably with people I know. So it's the web, the matrix, basically. And like Rosie, I'm going to switch my streaming stick. I have a Roku streaming stick, and the first thing I'm gonna do when I get home is turn the ACR off. And then, yeah, I'm considering an Apple TV now. If I'm in the system, I am in the Apple system. They already have my info, so might as well. Might as well.

ROSIE: I'll look online for a promo code. Two for one. There we go.

CHRISTINE: There we go. Thank you. Thank you.

ROSIE: And if you want to find out more about Wirecutter's coverage, or if you want to check out any of the products Jon recommended today, you can check out our website, or you can find a link in our show notes. That's it for us. Have a good week. Bye.

ROSIE: The Wirecutter Show is executive produced by Rosie Guerin and produced by Abigail Keel, with engineering support from Maddy Masiello and Nick Pitman. Today's episode was mixed by Catherine Anderson. Original music by Dan Powell, Marion Lozano, Elisheba Ittoop, and Diane Wong. Wirecutter's deputy publisher is Cliff Levy. Ben Frumin is Wirecutter's editor-in-chief. I'm Rosie Guerin.

CAIRA: I'm Caira Blackwell.

CHRISTINE: And I'm Christine Cyr Clisset.

ROSIE: Thanks for listening.

CHRISTINE: By the way, why is this legal?

JON: Sleep well tonight, my children.


Forbes
4 days ago
- Business
- Forbes
The Next Frontier: Why The Guild's Defense Tech Heralds A New Era Of Protection
Editor's Note: This article is a work of fiction. The events, businesses and individuals portrayed below are imaginary.

As reports of unprecedented threats to data privacy coincide with concerns about a recent uptick in violence, a relatively unknown military technology company has emerged with a unique value proposition: cutting-edge defense systems adapted for civilian use, counter-surveillance and personal protection. The Guild, a quiet enterprise leader with largely classified dealings in military intelligence and weapons development, is pivoting from shadow operations to the consumer market in part to address what its leaders say is a dire public need for advanced protection.

'In an era where autonomous AI systems protect military bases and robot dogs patrol warzones, The Guild believes its mission is to become a guardian for elder care facilities, campuses and critical infrastructure,' explains Victor Thane, The Guild's chief technology officer.

In an overt effort to differentiate itself from defense-tech pioneers who blurred the lines between Silicon Valley innovation and government contracting, The Guild expressly promises users that it won't monetize their data for profit. Instead, it aims to adapt, simplify and re-engineer advanced military technology for public benefit as part of a calculated institutional bet: in a collapsing trust economy, security may become the only asset that truly appreciates.

'Our job was never to be known,' says Thane. 'It was to be ready.'

Featured on the cover of the latest issue of Forbes is a visceral representation of that readiness: a humanoid bipedal robot with advanced capabilities, made for consumers. It's just one product in The Guild's sophisticated suite engineered to protect everyday civilians with military-grade technology.

Founded by former defense strategists and robotics engineers, The Guild spent over a decade building AI and autonomous systems for government and security clients.
Many of its early deployments remain classified, but its work reportedly spanned counter-threat modeling, reconnaissance robotics and mission-critical intelligence systems. Now, that expertise has been funneled into a line of adaptive defense products designed both for institutional use and practical domestic applications, like property monitoring. The suite includes:

Underpinning these hardware systems is an advanced intelligence platform that learns user environments and routines to perform predictive analysis. Every product is designed to detect anomalies, from irregular movement patterns to hostile micro-expressions, and to act on that information autonomously.

What may distinguish The Guild as much as its technology is its business philosophy. The company operates on a subscription-based model that, notably, does not monetize user data. Customers pay only for access to the hardware and software. In an age of surveillance capitalism, The Guild's model is deliberately simple: empowerment through intelligent defense.

'We don't collect what we don't need,' says Thane. 'Our systems are designed to safeguard, not surveil. Our mission is to protect when conventional safeguards fall short.'

Whether The Guild's vision of public defense becomes mainstream or remains niche, its people-first objective promises to usher in a new era of demand for technology enterprises to define their responsibility to users. The rise of AI-powered consumer protection systems raises broader questions about ethics, autonomy and the line between preparedness and intrusion. But The Guild's founders argue that the shift is inevitable.

'Today's threats don't look like yesterday's,' says Crane. 'And tomorrow's won't wait for consent. Our job is to keep people safe when the rules don't apply.'

The Guild is already seeing a surge in adoption by hospitals, private investors, former defense officials and global security firms, all looking for adaptive, learning-based systems of protection.
With plans to expand into strategic civilian infrastructure, global private partnerships, and next-gen AI protective systems, The Guild is poised to rewrite what private defense means for public protection.


Forbes
7 days ago
- Forbes
Do Not Keep These ‘High Risk' Apps On Your iPhone Or Android
While TikTok has generated the most headlines when it comes to allegations of your data being secretly sent to China, it turns out that a much bigger threat could have been hiding on your phone all this time. And this one is much more dangerous.

It has taken a spate of porn bans, first in the U.S. and now in Europe, to flush out this risk. As much as smartphone users need their TikTok fix, porn is an even bigger draw. And tens of millions of users are suddenly masking their internet traffic for the first time, pretending to be somewhere they are not to bypass those bans. This is done by way of virtual private networks, or VPNs, the same technology that failed to circumvent TikTok's short-lived U.S. ban in January. But for porn, VPNs work just fine.

vpnMentor saw a 'staggering' 6,000% surge in U.K. VPN use after restrictions came into effect, the same explosive growth seen in the U.S. and France. Many of the installed VPNs were free apps topping App Store and Play Store charts. But many of these have a nasty, hidden secret. As Top10VPN's Simon Migliano warns, 'despite being made aware of glaring privacy failures and opaque corporate structures, Google and Apple continue to permit these high-risk apps on their platforms.'

A month ago, the Tech Transparency Project (TTP) issued a report into free VPNs, warning that 'millions of Americans have downloaded apps that secretly route their internet traffic through Chinese companies.' It reported on this same threat in April: 'Apple and Google app stores continue to offer private browsing apps that are surreptitiously owned by Chinese companies… six weeks after they were identified.'

'In light of these findings,' Migliano warns, 'I strongly urge users to avoid Chinese-owned VPNs altogether.' He says 'the risks are too great' to keep them on your phone. As BeyondTrust's James Maude told me, 'if you aren't paying for a product, you are the product.
These VPNs are a perfect example of the hidden costs of free apps, where users seeking privacy are potentially unknowingly feeding data to a foreign nation state.'

Google told me it is 'committed to compliance with applicable sanctions and trade compliance laws. When we locate accounts that may violate these laws, our related policies or Terms of Service, we take appropriate action.' Apple, meanwhile, says it enforces App Store rules but does not differentiate its handling of apps by the location of their developers, albeit VPN apps are prohibited from sharing data.

My advice is to open either the App Store on your iPhone or the Play Store on your Android, and search for 'free VPN.' Delete any apps listed as installed on your phone that carry that 'free VPN' tag, unless they are linked to blue-chip, western technology firms that provide other security offerings. Meanwhile, here's the TTP list of Chinese apps you should search for:

Apple App Store:

Google Play Store:
Yahoo
01-08-2025
- Business
- Yahoo
Sovereign washing in the context of public cloud services
The geopolitical turmoil of recent years, trade nationalism, and all manner of security concerns have driven demand for digital sovereignty solutions, and market growth for sovereign cloud services has been exponential, with enterprises trying to safeguard their valuable customer data. However, while most cloud computing providers offering these sovereign solutions are based in the US, it is European companies that are demanding them. European organisations are among the most respectful of data privacy on earth, with legislation designed to protect consumer data against powerful, profit-driven multinationals, particularly in specific industry sectors such as healthcare and life sciences. As a result, a new term has been coined: 'sovereign washing'.

Demand for digital transformation projects, including local access to GenAI solutions, is growing. The public cloud has helped businesses improve their performance and has many benefits. But in these uncertain times, special care must be taken when it comes to sensitive data.

As in other instances of 'washing', the term describes the way many large tech companies promote their offerings as 'sovereign cloud', 'digital sovereignty', and so on for marketing purposes when, in fact, they are not actually fulfilling demands for true data control and security. Terms such as 'local', 'sovereign', 'compliance', and 'national' are used as throwaway lines in PR collateral, but they have no real meaning.

The sovereign washing trend

The trend of sovereign washing became most apparent when Anton Carniaux, Microsoft France's director of public and legal affairs, declared under oath, during a French Senate inquiry into the role of public procurement in promoting digital sovereignty, that he could not guarantee that French citizen data would never be transmitted to US authorities without explicit French authorisation.
It then became apparent that US legislation, specifically the US CLOUD Act, means that companies headquartered in the US can be made to hand over data to US authorities regardless of where the data is stored. When US-based hyperscalers such as Microsoft Azure, AWS and Google Cloud claim to offer digital sovereignty solutions, their services remain subject to US law. Sometimes, these companies partner with local companies to try to find a compromise.

The term emphasises the growing importance of undertaking due diligence when it comes to claims of 'digital sovereignty' solutions, to find out whether companies are truly committed to the principles of data autonomy and control.

"Sovereign washing in the context of public cloud services" was originally created and published by Verdict, a GlobalData owned brand.